Error Bounds for Real Function Classes Based on Discretized Vapnik-Chervonenkis Dimensions

Authors

  • Chao Zhang
  • Dacheng Tao
Abstract

The Vapnik-Chervonenkis (VC) dimension plays an important role in statistical learning theory. In this paper, we propose the discretized VC dimension, obtained by discretizing the range of a real function class. We then show that Sauer’s Lemma remains valid for the discretized VC dimension. Using the discretized VC dimension, we group real function classes with infinite VC dimension into four categories. As a byproduct, we present the equidistantly discretized VC dimension, obtained by introducing an equidistant partition to segment the range of a real function class. Finally, we derive error bounds for real function classes based on the discretized VC dimensions in the PAC-learning framework.
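Since only the abstract is available here, the following is a minimal sketch (in Python, with hypothetical helper names `equidistant_discretize` and `shatters`) of the idea the abstract describes, not the authors' construction: an equidistant partition of the range turns a real function class into a finite-valued one, whose shattering behavior on a sample can then be tested combinatorially.

```python
def equidistant_discretize(f, lo, hi, k):
    """Map the real-valued function f onto one of k equidistant levels
    covering [lo, hi] (an illustrative construction, not the paper's)."""
    width = (hi - lo) / k
    def g(x):
        # Clamp the output to [lo, hi], then snap it to its cell index.
        clamped = min(max(f(x), lo), hi)
        return min(int((clamped - lo) // width), k - 1)
    return g

def shatters(functions, points, level):
    """Brute-force check: do the indicator sets {x : g(x) >= level}
    realized by `functions` on `points` hit all 2^n binary patterns?"""
    patterns = {tuple(int(g(x) >= level) for x in points) for g in functions}
    return len(patterns) == 2 ** len(points)
```

Under this reading, the (equidistantly) discretized VC dimension of the class would be the largest n for which some n-point set is shattered at some level; once that dimension is finite, Sauer’s Lemma bounds the number of realizable patterns.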


Similar resources

Quantifying Generalization in Linearly Weighted Neural Networks

The Vapnik-Chervonenkis dimension has proven to be of great use in the theoretical study of generalization in artificial neural networks. The "probably approximately correct" learning framework is described and the importance of the Vapnik-Chervonenkis dimension is illustrated. We then investigate the Vapnik-Chervonenkis dimension of certain types of linearly weighted neural networks...


Sign rank versus Vapnik-Chervonenkis dimension

This work studies the maximum possible sign rank of sign (N × N)-matrices with a given Vapnik-Chervonenkis dimension d. For d = 1, this maximum is three. For d = 2, this maximum is Θ̃(N). For d > 2, similar but slightly less accurate statements hold. The lower bounds improve on previous ones by Ben-David et al., and the upper bounds are novel. The lower bounds are obtained by probabilistic constructions...


Generalization Bounds for Connected Function Classes

We derive an improvement to the Vapnik-Chervonenkis bounds on generalization error which applies to many commonly used function classes, including feedforward neural networks. The VC analysis simply counts the number of dichotomies induced by a function class on a set of input vectors. We can achieve a richer description of the flexibility of a function class by taking into account the diversity ...
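As a hedged illustration of the dichotomy counting this snippet refers to (a brute-force Python sketch, not the paper's analysis), the growth of a binary-valued function class on a fixed set of inputs is simply the number of distinct label patterns it realizes:

```python
def count_dichotomies(functions, inputs):
    """Number of distinct labelings (dichotomies) that `functions`,
    assumed to output 0/1, induce on the fixed input points."""
    return len({tuple(f(x) for x in inputs) for f in functions})

# Example: threshold functions on the real line induce only n + 1
# dichotomies on n distinct points, far fewer than 2**n.
points = [0.0, 1.0, 2.0, 3.0]
thresholds = [lambda x, t=t: int(x >= t) for t in [-0.5, 0.5, 1.5, 2.5, 3.5]]
print(count_dichotomies(thresholds, points))  # prints 5
```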


Making Vapnik-Chervonenkis bounds accurate

This chapter shows how returning to the combinatorial nature of the Vapnik-Chervonenkis bounds provides simple ways to increase their accuracy, take into account properties of the data and of the learning algorithm, and provide empirically accurate estimates of the deviation between training error and testing error.


Automatic Pattern Recognition: A Study of the Probability of Error

A test sequence is used to select the best rule from a rich class of discrimination rules defined in terms of the training sequence. The Vapnik-Chervonenkis and related inequalities are used to obtain distribution-free bounds on the difference between the probability of error of the selected rule and the probability of error of the best rule in the given class. The bounds are used to prove th...



Journal:
  • Austr. J. Intelligent Information Processing Systems

Volume 12, Issue -

Pages -

Publication date: 2010